Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting

Authors

  • Xialei Liu
  • Marc Masana
  • Luis Herranz
  • Joost van de Weijer
  • Antonio M. López
  • Andrew D. Bagdanov
Abstract

In this paper we propose an approach to avoiding catastrophic forgetting in sequential task learning scenarios. Our technique is based on a network reparameterization that approximately diagonalizes the Fisher Information Matrix of the network parameters. This reparameterization takes the form of a factorized rotation of parameter space which, when used in conjunction with Elastic Weight Consolidation (which assumes a diagonal Fisher Information Matrix), leads to significantly better performance on lifelong learning of sequential tasks. Experimental results on the MNIST, CIFAR-100, CUB-200 and Stanford-40 datasets demonstrate that we significantly improve the results of standard elastic weight consolidation, and that we obtain competitive results when compared to other state-of-the-art approaches to lifelong learning without forgetting.
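To make the diagonal assumption concrete, here is a minimal numpy sketch of plain EWC for logistic regression: a diagonal (empirical) Fisher is estimated on task A, then used to anchor the weights while training on task B. All names and hyperparameter values are illustrative, not taken from the paper; the paper's contribution is a reparameterization applied before this step, a fixed rotation of each layer's input and output spaces derived from second-order statistics of layer inputs and backpropagated gradients, which makes the diagonal approximation below more accurate.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def grad_nll(w, x, y):
        # Gradient of the negative log-likelihood of logistic
        # regression for a single example (x, y).
        return (sigmoid(x @ w) - y) * x

    def train(w, X, Y, grad_fn, lr=0.1, epochs=200):
        for _ in range(epochs):
            for x, y in zip(X, Y):
                w = w - lr * grad_fn(w, x, y)
        return w

    # Task A: ordinary training.
    Xa = rng.normal(size=(100, 5))
    Ya = (Xa[:, 0] > 0).astype(float)
    w_a = train(np.zeros(5), Xa, Ya, grad_nll)

    # Diagonal Fisher on task A: per-parameter squared gradients,
    # averaged over the data (the empirical-Fisher shortcut).
    fisher = np.mean([grad_nll(w_a, x, y) ** 2 for x, y in zip(Xa, Ya)], axis=0)

    # Task B: same training loop, but each gradient step also pulls
    # the weights toward w_a, weighted by the Fisher diagonal.
    lam = 100.0  # penalty strength; an arbitrary illustrative value

    def grad_ewc(w, x, y):
        # d/dw of [ NLL_B + (lam / 2) * sum_i F_i (w_i - w_a_i)^2 ]
        return grad_nll(w, x, y) + lam * fisher * (w - w_a)

    Xb = rng.normal(size=(100, 5))
    Yb = (Xb[:, 1] > 0).astype(float)
    w_b = train(w_a.copy(), Xb, Yb, grad_ewc)

A diagonal Fisher is what makes the penalty cheap (one scalar per parameter); the paper's point is that in the network's default parameterization the true Fisher is far from diagonal, and a well-chosen rotation makes the diagonal approximation much less damaging.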


Similar Papers

Consolidation in Neural Networks and in the Sleeping Brain

In this paper we explore the topic of the consolidation of information in neural network learning. One problem in particular has limited the ability of a broad range of neural networks to perform ongoing learning and consolidation. This is “catastrophic forgetting”, the tendency for new information, when it is learned, to disrupt old information. We will review and slightly extend the rehearsal...


Catastrophic Forgetting and the Pseudorehearsal Solution in Hopfield Networks

Most artificial neural networks suffer from the problem of catastrophic forgetting, where previously learnt information is suddenly and completely lost when new information is learnt. Memory in real neural systems does not appear to suffer from this unusual behaviour. In this thesis we discuss the problem of catastrophic forgetting in Hopfield networks, and investigate various potential solutio...


On Quadratic Penalties in Elastic Weight Consolidation

Elastic weight consolidation [EWC, Kirkpatrick et al., 2017] is a novel algorithm designed to safeguard against catastrophic forgetting in neural networks. EWC can be seen as an approximation to Laplace propagation [Eskin et al., 2004], and this view is consistent with the motivation given by Kirkpatrick et al. [2017]. In this note, I present an extended derivation that covers the case when the...

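For reference, the quadratic penalty under discussion, as introduced by Kirkpatrick et al. [2017], anchors the parameters learned on an earlier task A while training on a new task B:

    \mathcal{L}(\theta) = \mathcal{L}_B(\theta) + \sum_i \frac{\lambda}{2} F_i \left( \theta_i - \theta_{A,i}^{*} \right)^2

where \mathcal{L}_B is the new-task loss, F_i are the diagonal entries of the Fisher Information Matrix estimated at the old-task solution \theta_A^{*}, and \lambda sets the penalty strength.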

Evolving Neural Networks That Suffer Minimal Catastrophic Forgetting

Catastrophic forgetting is a well-known failing of many neural network systems whereby training on new patterns causes them to forget previously learned patterns. Humans have evolved mechanisms to minimize this problem, and in this paper we present our preliminary attempts to use simulated evolution to generate neural networks that suffer significantly less from catastrophic forgetting than tra...


Neural networks with a self-refreshing memory: knowledge transfer in sequential learning tasks without catastrophic forgetting

We explore a dual-network architecture with self-refreshing memory (Ans and Rousset 1997) which overcomes catastrophic forgetting in sequential learning tasks. Its principle is that new knowledge is learned along with an internally generated activity reflecting the network history. What mainly distinguishes this model from others using pseudorehearsal in feedforward multilayer networks is a rev...

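The pseudorehearsal mechanism shared by this work and the Hopfield-network thesis above can be illustrated in a few lines: rather than storing old training data, random probe inputs are passed through the previously trained network, and its responses are kept as synthetic input-target pairs interleaved with the new task. The linear toy model below illustrates only that idea, not Ans and Rousset's dual-network architecture; all names and sizes are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)

    # A trained linear map stands in for the "old" network.
    W_old = rng.normal(size=(8, 4))

    # Pseudorehearsal: probe with random inputs, keep the old
    # network's own responses as synthetic training targets.
    pseudo_x = rng.uniform(-1.0, 1.0, size=(64, 8))
    pseudo_y = pseudo_x @ W_old

    # New task data (random here, purely for illustration).
    new_x = rng.normal(size=(64, 8))
    new_y = rng.normal(size=(64, 4))

    # Train on the new task interleaved with the pseudo-items;
    # the pseudo-items keep pulling W back toward old behaviour.
    W, lr = W_old.copy(), 0.01
    for _ in range(200):
        for X, Y in ((new_x, new_y), (pseudo_x, pseudo_y)):
            err = X @ W - Y
            W -= lr * (X.T @ err) / len(X)  # mean-squared-error gradient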



Journal:
  • CoRR

Volume: abs/1802.02950  Issue:

Pages: -

Publication date: 2018